A Survey: Load Balancing for Distributed File System
Abstract
Distributed systems are useful for computation and storage of large-scale data at dispersed locations. A Distributed File System (DFS) is a subsystem of a distributed system, and it is a means of sharing storage space and data. Servers, storage devices, and clients are at dispersed locations in a DFS. Fault tolerance and scalability are two main features of a distributed file system. Performance of a DFS is...
Similar Articles
A Survey Report on Distributed System Using Load Balancing Approach
A number of load balancing algorithms were developed in order to improve the execution of a distributed application in any kind of distributed architecture. Load balancing involves assigning tasks to each processor and minimizing the execution time of the program. In practice, it would be possible even to execute the applications on any machine of a worldwide distributed system. This results in a ...
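As a minimal illustration of that idea (not an algorithm taken from the surveyed paper), the sketch below uses a simple greedy rule that assigns each task to the currently least-loaded processor; the task costs and processor count are assumed for the example.

```python
import heapq

def assign_tasks(task_costs, num_processors):
    # Min-heap of (current_load, processor_id): the root is always the
    # least-loaded processor.
    heap = [(0.0, p) for p in range(num_processors)]
    heapq.heapify(heap)
    assignment = {p: [] for p in range(num_processors)}
    # Placing larger tasks first (LPT rule) tends to balance final loads better.
    for task_id, cost in sorted(enumerate(task_costs), key=lambda t: -t[1]):
        load, proc = heapq.heappop(heap)
        assignment[proc].append(task_id)
        heapq.heappush(heap, (load + cost, proc))
    return assignment

if __name__ == "__main__":
    # Hypothetical task costs (e.g., estimated execution times in seconds).
    print(assign_tasks([5.0, 3.0, 8.0, 2.0, 7.0], num_processors=2))
```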
Distributed Load Balancing for FREEDM system
The FREEDM microgrid is a test bed for a smart grid integrated with Distributed Grid Intelligence (DGI) to efficiently manage the distribution and storage of renewable energy. Within the FREEDM system, DGI provides a unique way of applying distributed algorithms to achieve economically feasible and optimal utilization and storage of alternative energy sources in a distributed fashion. The FREED...
A Load Balancing Tool Based on Mining Access Patterns for Distributed File System Servers
In this paper we present a new web-based Distributed File System server management tool to perform load balancing across multiple servers. The Distributed File System from the Distributed Computing Environment (DCE DFS) is a collection of many file systems mounted onto a single virtual file system space with a single namespace. The tool is based on rule-based data mining techniques and graph analys...
The Hadoop Distributed File System: Balancing Portability
Hadoop is a software framework that supports data-intensive distributed applications. Hadoop creates clusters of machines and coordinates the work among them. It includes two major components, HDFS (Hadoop Distributed File System) and MapReduce. HDFS is designed to store large amounts of data reliably and provide high availability of data to user applications running at the client. It creates multiple da...
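As a rough, hypothetical illustration of that replication idea (not HDFS's actual rack-aware placement policy), the sketch below spreads a fixed number of block replicas over the least-used nodes; the node names, replication factor, and usage metric are all assumed for the example.

```python
from collections import defaultdict

def place_replicas(nodes, usage, replication_factor=3):
    # Pick distinct nodes currently holding the fewest blocks, so copies are
    # spread out and the loss of one node leaves other replicas readable.
    targets = sorted(nodes, key=lambda n: usage[n])[:replication_factor]
    for node in targets:
        usage[node] += 1
    return targets

if __name__ == "__main__":
    usage = defaultdict(int)  # blocks stored per node (assumed load metric)
    nodes = ["node1", "node2", "node3", "node4"]
    for block_id in range(5):
        print(block_id, place_replicas(nodes, usage))
```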
Journal
Journal title: International Journal of Computer Applications
Year: 2015
ISSN: 0975-8887
DOI: 10.5120/19536-1190